Multi-kernel regularized classifiers

Authors

  • Qiang Wu
  • Yiming Ying
  • Ding-Xuan Zhou
Abstract

A family of classification algorithms generated from Tikhonov regularization schemes is considered. They involve multi-kernel spaces and general convex loss functions. Our main purpose is to provide satisfactory estimates for the excess misclassification error of these multi-kernel regularized classifiers. The error analysis consists of two parts: regularization error and sample error. Allowing multi-kernels in the algorithm improves the regularization error and approximation error, which is one advantage of the multi-kernel setting. For a general loss function, we show how to bound the regularization error by the approximation in some weighted L spaces. For the sample error, we use a projection operator. The projection, in connection with the decay of the regularization error, enables us to improve convergence rates in the literature even for the one-kernel schemes and special loss functions: the least-square loss and the hinge loss for support vector machine soft margin classifiers. Existence of a solution to the optimization problem for the regularization scheme associated with multi-kernels is verified when the kernel functions are continuous with respect to the index set. Gaussian kernels with flexible variances and probability distributions with some noise conditions are demonstrated to illustrate the general theory.
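As a rough illustration of the setting (a minimal sketch, not the paper's analysis or its algorithm), the snippet below fits a Tikhonov-regularized kernel classifier with the least-square loss for each member of a small family of Gaussian kernels with different variances, and keeps the variance with the smallest validation misclassification error. All function names, the toy data, and the validation-based selection are assumptions for illustration.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    # Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_tikhonov(K, y, lam):
    # Tikhonov scheme with least-square loss in the RKHS of one
    # fixed kernel: solve (K + lam * n * I) c = y.
    n = K.shape[0]
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def best_gaussian(X, y, Xval, yval, sigmas, lam=0.1):
    # One regularized classifier per candidate variance; keep the
    # variance with the smallest validation misclassification error.
    best = None
    for sigma in sigmas:
        c = fit_tikhonov(gaussian_kernel(X, X, sigma), y, lam)
        pred = np.sign(gaussian_kernel(Xval, X, sigma) @ c)
        err = float(np.mean(pred != yval))
        if best is None or err < best[0]:
            best = (err, sigma)
    return best

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 2))
y = np.sign(X[:, 0] * X[:, 1])            # XOR-like labels in {-1, +1}
Xval = rng.normal(size=(40, 2))
yval = np.sign(Xval[:, 0] * Xval[:, 1])
err, sigma = best_gaussian(X, y, Xval, yval, [0.2, 0.5, 1.0, 2.0])
```

Searching over several variances is a crude stand-in for the flexible-variance Gaussian family the abstract mentions; the paper's multi-kernel scheme optimizes over the kernel index set inside the regularization problem itself.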


Similar resources

Regularized Least Squares Piecewise Multi-classification Machine

This paper presents a Tikhonov-regularization-based piecewise classification model for multi-category discrimination of sets or objects. The proposed model includes both a linear classification formulation and a nonlinear kernel classification formulation. Advantages of the regularized multi-classification formulations include the ability to express a multi-class problem as a single and unconstrained optimi...


Indoor Localization via Discriminatively Regularized Least Square Classification

In this paper, we address the received signal strength (RSS)-based indoor localization problem in a wireless local area network (WLAN) environment and formulate it as a multi-class classification problem using survey locations as classes. We present a discriminatively regularized least square classifier (DRLSC)-based localization algorithm that is aimed at making use of the class label informat...
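For orientation, here is the plain regularized least-square multi-class baseline that such discriminatively regularized classifiers build on (a hedged sketch with assumed names and toy data, not the DRLSC of the cited paper): one-hot targets, a bias column, a single regularized linear solve, and argmax decoding.

```python
import numpy as np

def rls_fit(X, labels, n_classes, lam=1e-2):
    # One-hot targets, bias column, single Tikhonov-regularized solve:
    # W = (Xb^T Xb + lam I)^{-1} Xb^T Y
    Xb = np.hstack([X, np.ones((len(X), 1))])
    Y = np.eye(n_classes)[labels]
    d = Xb.shape[1]
    return np.linalg.solve(Xb.T @ Xb + lam * np.eye(d), Xb.T @ Y)

def rls_predict(X, W):
    # Decode by taking the class with the largest score.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.argmax(Xb @ W, axis=1)

rng = np.random.default_rng(1)
centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
labels = rng.integers(0, 3, size=150)
X = centers[labels] + 0.3 * rng.normal(size=(150, 2))
W = rls_fit(X, labels, 3)
acc = float(np.mean(rls_predict(X, W) == labels))
```

The discriminative regularization described in the teaser would replace the plain `lam * I` penalty with a term driven by the class labels.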


Online Learning with Regularized Kernel for One-class Classification

This paper presents an online learning scheme with a regularized kernel-based one-class extreme learning machine (ELM) classifier, referred to as "online RK-OC-ELM". The baseline kernel hyperplane model considers the whole data in a single chunk with the regularized ELM approach for offline learning in the case of one-class classification (OCC). Further, the basic hyperplane model is adapted in an online fashio...


The Margin Vector, Admissible Loss and Multi-class Margin-based Classifiers

We propose a new framework for constructing margin-based classifiers, in which the binary and multicategory classification problems are solved by the same principle: margin-based classification via regularized empirical risk minimization. To build the framework, we propose the margin vector, which is the multi-class generalization of the margin; then we further generalize the concept of ...


Classification with Kernel Mahalanobis Distance Classifiers

Within the framework of kernel methods, linear data methods have almost completely been extended to their nonlinear counterparts. In this paper, we focus on nonlinear kernel techniques based on the Mahalanobis distance. Two approaches are distinguished here. The first one assumes an invertible covariance operator, while the second one uses a regularized covariance. We discuss conceptual and exp...
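The regularized-covariance idea in the second approach can be sketched in input space (the cited paper kernelizes it; the names and toy data below are assumptions for illustration): replace a possibly singular sample covariance C by C + eps * I before inverting it for the Mahalanobis distance.

```python
import numpy as np

def regularized_mahalanobis(X, eps=1e-3):
    # Regularize a possibly non-invertible covariance as C + eps * I,
    # then return a Mahalanobis-distance function to the sample mean.
    mu = X.mean(axis=0)
    Xc = X - mu
    C = Xc.T @ Xc / len(X)                    # sample covariance
    C_inv = np.linalg.inv(C + eps * np.eye(C.shape[1]))
    def dist(z):
        d = np.asarray(z) - mu
        return float(np.sqrt(d @ C_inv @ d))
    return dist

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))
dist = regularized_mahalanobis(X)
d_mean = dist(X.mean(axis=0))        # exactly at the mean: distance 0
d_far = dist(X.mean(axis=0) + 10.0)  # a point far from the data cloud
```

In the kernel version the same regularization is applied to the covariance operator in feature space, where invertibility genuinely fails.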




Journal title:
  • J. Complexity

Volume 23, Issue –

Pages –

Publication date: 2007